Multi-Step Stochastic ADMM in High Dimensions: Applications to Sparse Optimization and Matrix Decomposition
Authors
Abstract
In this paper, we consider a multi-step version of the stochastic ADMM method with efficient guarantees for high-dimensional problems. We first analyze the simple setting, where the optimization problem consists of a loss function and a single regularizer (e.g. sparse optimization), and then extend to the multi-block setting with multiple regularizers and multiple variables (e.g. matrix decomposition into sparse and low-rank components). For the sparse optimization problem, our method achieves the minimax rate of O(s log d/T) for s-sparse problems in d dimensions in T steps, and is thus unimprovable by any method up to constant factors. For the matrix decomposition problem with a general loss function, we analyze the multi-step ADMM with multiple blocks. We establish an O(1/T) convergence rate and efficient scaling as the size of the matrix grows. For natural noise models (e.g. independent noise), our convergence rate is minimax-optimal. Thus, we establish tight convergence guarantees for multi-block ADMM in high dimensions. Experiments show that, for both sparse optimization and matrix decomposition problems, our algorithm outperforms state-of-the-art methods.
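The single-regularizer setting described in the abstract (a loss plus an ℓ1 penalty, i.e. a lasso-type problem) can be sketched with a linearized stochastic ADMM update. The step-size schedule, penalty parameter `rho`, and function names below are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (entrywise shrinkage)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_admm_lasso(A, b, lam=0.1, rho=1.0, T=500, batch=8, seed=0):
    """Single-regularizer stochastic ADMM sketch for the lasso,
    min_x 1/(2n) ||Ax - b||^2 + lam ||x||_1, with the split x - z = 0.
    Step sizes and rho are illustrative, not the paper's schedule."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x, z, u = np.zeros(d), np.zeros(d), np.zeros(d)    # u is the scaled dual
    for t in range(1, T + 1):
        idx = rng.integers(0, n, size=batch)           # sample a mini-batch
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch   # stochastic loss gradient
        eta = 0.1 / np.sqrt(t)                         # decaying step size
        # linearized x-update: gradient step on loss + augmented-Lagrangian term
        x = x - eta * (g + rho * (x - z + u))
        z = soft_threshold(x + u, lam / rho)           # z-update: soft-thresholding
        u = u + x - z                                  # dual ascent
    return z

# usage: recover a sparse signal from noisy random linear measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
x_true = np.zeros(50); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = stochastic_admm_lasso(A, b, lam=0.05)
```

Each iteration touches only a mini-batch of rows, and the soft-thresholding z-update is what produces exactly sparse iterates.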
Similar references
Multi-Step Stochastic ADMM in High Dimensions: Applications to Sparse Optimization and Noisy Matrix Decomposition
We propose an efficient ADMM method with guarantees for high-dimensional problems. We provide explicit bounds for the sparse optimization problem and the noisy matrix decomposition problem. For sparse optimization, we establish that the modified ADMM method has an optimal regret bound of O(s log d/T), where s is the sparsity level, d is the data dimension and T is the number of steps. This mat...
Noisy matrix decomposition via convex relaxation: Optimal rates in high dimensions
We analyze a class of estimators based on convex relaxation for solving high-dimensional matrix decomposition problems. The observations are noisy realizations of a linear transformation X of the sum of an (approximately) low rank matrix Θ⋆ with a second matrix Γ⋆ endowed with a complementary form of low-dimensional structure; this set-up includes many statistical models of interest, including ...
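The convex relaxation analyzed in this line of work can be written, in one common form, as a regularized M-estimator; the notation below is a standard reconstruction of such programs (for the case where Γ⋆ is entrywise sparse), not a formula quoted from the abstract:

```latex
\min_{\Theta,\,\Gamma}\;
  \mathcal{L}\bigl(\Theta + \Gamma;\, y\bigr)
  \;+\; \lambda \,\|\Theta\|_{\mathrm{nuc}}
  \;+\; \mu \,\|\Gamma\|_{1}
  \qquad \text{s.t.}\quad \|\Theta\|_{\infty} \le \frac{\alpha}{d},
```

where the nuclear norm promotes low rank in Θ, the ℓ1 norm promotes entrywise sparsity in Γ, and the ℓ∞ ("spikiness") constraint keeps the low-rank component identifiable against the sparse one.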
Scalable Stochastic Alternating Direction Method of Multipliers
Alternating direction method of multipliers (ADMM) has been widely used in many applications due to its promising performance in solving complex regularization problems and large-scale distributed optimization problems. Stochastic ADMM, which visits only one sample or a mini-batch of samples at a time, has recently been shown to achieve better performance than batch ADMM. However, most stochasti...
An ADMM Solution to the Sparse Coding Problem
For our project, we apply the alternating direction method of multipliers and sequential convex optimization to sparse coding of images. The motivation behind sparse coding of images is to model how the brain is able to efficiently use the human visual system for a variety of tasks, such as separating a car from a background, as well as general classification tasks. Sparse coding aim...
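The code-inference step of sparse coding (solving for an ℓ1-penalized code against a fixed dictionary) has a particularly clean ADMM form, since the quadratic subproblem can be solved exactly. The dictionary `D`, the penalty `lam`, and the parameter choices below are illustrative assumptions, not the project's exact setup:

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_sparse_code(D, x, lam=0.1, rho=1.0, iters=100):
    """ADMM sketch for sparse coding with a fixed dictionary D:
    min_z 1/2 ||x - Dz||^2 + lam ||z||_1, with the split z - w = 0.
    (Full dictionary learning would alternate this with updates of D.)"""
    k = D.shape[1]
    G = D.T @ D + rho * np.eye(k)   # fixed system matrix for the z-subproblem
    Dtx = D.T @ x
    z, w, u = np.zeros(k), np.zeros(k), np.zeros(k)
    for _ in range(iters):
        z = np.linalg.solve(G, Dtx + rho * (w - u))  # exact ridge-type z-update
        w = soft_threshold(z + u, lam / rho)         # prox of lam * ||.||_1
        u = u + z - w                                # scaled dual update
    return w

# usage: code a signal against an overcomplete random dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
z_true = np.zeros(128); z_true[:4] = 1.0
x = D @ z_true + 0.01 * rng.standard_normal(64)
code = admm_sparse_code(D, x, lam=0.1)
```

Because the system matrix G is fixed, it could also be factored once (e.g. by Cholesky) outside the loop to amortize the cost of the z-update.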